Web Survey Bibliography
"Dynamic form" is the generic heading for dynamic text fields and dynamic lists, two innovative ways of reactive data collection in self-administered online surveys.
Dynamic forms are considered a Web 2.0 technique. We show here that this technique can be used to combine the advantages of open-ended and closed-ended question types.
Open-ended questions do not constrain the respondent's choice of answer. Closed-ended questions are often faster to answer, demand little mental effort, and are easy to standardize; moreover, data gathered with them require little coding time and lend themselves to statistical analysis. At first glance, _dynamic text fields_ do not differ from ordinary HTML text fields. However, as soon as the respondent begins typing, suggestions for the most probable word appear in an area below the text field, and with each new letter these suggestions are readjusted. At http://labs.google.com/suggest Google demonstrates the use of this technique in a search engine.
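The matching step behind such a dynamic text field can be sketched as a simple prefix lookup over a predefined vocabulary, ranked by how probable each completion is. This is an illustrative sketch only; the vocabulary, the frequency-based ranking, and the `suggest` function are hypothetical examples, not taken from the study.

```python
# Sketch of the suggestion logic of a dynamic text field: as the
# respondent types, the entry is matched against a vocabulary and the
# most probable completions are offered. Frequencies stand in for
# "most probable word" and are hypothetical.

def suggest(prefix: str, vocabulary: dict[str, int], limit: int = 3) -> list[str]:
    """Return up to `limit` vocabulary entries starting with `prefix`,
    most frequent first."""
    prefix = prefix.lower()
    matches = [w for w in vocabulary if w.lower().startswith(prefix)]
    matches.sort(key=lambda w: -vocabulary[w])
    return matches[:limit]

# Example: a respondent answering "subject of study" types a few letters.
subjects = {"Psychology": 120, "Physics": 80, "Philosophy": 45, "Political Science": 60}
print(suggest("ps", subjects))  # ['Psychology']
print(suggest("ph", subjects))  # ['Physics', 'Philosophy']
```

With each additional character the respondent enters, the function is called again on the longer prefix, which is why the suggestion list narrows as the entry grows.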
With _dynamic lists_, even questions whose many response categories can be arranged hierarchically can be answered like closed-ended questions. At first, the respondent sees only a single table with very general categories. As soon as one of these is selected, more specific choices appear in a second table. Finding the appropriate answer is supported by gradually offering chunks of increasingly detailed descriptions.
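The two-table interaction described above amounts to walking one level of a category hierarchy at a time. The following sketch illustrates the idea with a hypothetical occupation taxonomy; the category names and function names are invented for illustration and do not come from the study.

```python
# Sketch of a dynamic list: response categories form a hierarchy, and
# selecting a general category reveals the more specific choices
# beneath it. The taxonomy below is a hypothetical example.

TAXONOMY = {
    "Health occupations": ["Nurse", "Physician", "Pharmacist"],
    "Technical occupations": ["Engineer", "Electrician", "Programmer"],
    "Service occupations": ["Cook", "Hairdresser", "Sales clerk"],
}

def top_level() -> list[str]:
    """First table shown to the respondent: very general categories."""
    return list(TAXONOMY)

def sub_choices(category: str) -> list[str]:
    """Second table: specific choices for the selected category."""
    return TAXONOMY.get(category, [])

print(top_level())
print(sub_choices("Technical occupations"))  # ['Engineer', 'Electrician', 'Programmer']
```

Deeper hierarchies (e.g. official occupation classifications) would simply repeat this step, each selection revealing the next, more detailed table.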
Both kinds of dynamic forms are suitable for measuring variables with more possible values than is feasible with traditional closed-ended questions (e.g. subject of study or occupational classification).
Dynamic forms break new ground in online research and have not yet been examined, for example regarding their influence on data quality or the cognitive processes underlying response behavior. We postulate a shift from recall to recognition when dynamic forms are used instead of open-ended questions; consequently, the number and quality of responses should increase.
In the experimental panel studies presented here, dynamic text fields and dynamic lists were compared with radio buttons, drop-down menus, and standard text fields. This allowed us to examine the influence of dynamic forms on motivation to participate, data quality, response times, and the effort needed to code the data.
"Dynamische Formulare" ist der Oberbegriff fOr dynamische Textfelder und dynamische Listen, zwei innovative Arten reaktiver Datenerhebung in selbstadministrierten Onlinebefragungen.
Dynamische Formulare werden als eine Web-2.0-Technik angesehen. Wir zeigen hier, dass diese Technik genutzt werden kann, um die Vorteile offener und geschlossener Fragetypen miteinander zu verbinden. Offenen Fragen beschranken die Antwortm6glichkeit des Befragten nicht durch Vorgaben; geschlossenen Fragen lassen sich haufig schneller und mit geringerer kognitiver Beanspruchung beantworten und bieten den Vorteil der einfachen Standardisierung. Zudem k6nnen Daten, die mit geschlossenen Fragen erhoben wurden, schnell fOr statistische Auswertungen vercodet werden.
_Dynamische Textfelder_ sehen auf den ersten Blick genauso aus wie herk6mmliche HTML-Textfelder. Sobaid jedoch mit der Texteingabe begonnen wird, erscheinen in einem Bereich unterhalb des Eingabefeldes Vorschlage, welches Wort gerade wahrscheinlich eingegeben wird. Mit jedem weiteren eingegebenen Zeichen passen sich die Vorschlage an. Auf http://labs.google.com/suggest findet sich ein Beispiel fOr die Anwendung dieser Technik in einer Suchmaschine. _Dynamische Listen_ erm6glichen die geschlossene Erhebung von Items mit einer groflen Zahl von Auspragungen, die sich hierarchisch ordnen lassen. Zunachst sieht der Nutzer nur eine Tabelle mit allgemeinen Kategorien. Sobaid auf der obersten Ebene eine Kategorie gewahlt wird, erscheinen in einer zweiten Tabelle speziellere Auswahlm6glichkeiten. Das Finden der zutreffenden Antwort wird durch das stufenweise Darbieten von Chunks mit detaillierteren Informationen unterstutzt.
Dynamische Formulare sind wissenschaftliches Neuland. Bisher wurde nicht erforscht, ob ihr Einsatz einen Einfluss auf die DatengUte oder die kognitiven Prozesse hat, die dem Antwortverhalten zugrunde liegen. Wir postulieren, dass auf kognitiver Ebene ein Wechsel von Recall zu Recognition stattfindet, wenn dynamische Formulare anstelle von offenen Fragen genutzt werden.
ln den vorgestellten experimentellen Panel-Studien wurden dynamische TextfeIder und Listen mit Radiobuttons, Drop-down-Mens und herk6mmlichen Textfeldern verglichen. Dadurch konnte der Einfluss dynamischer Formulare auf die Teilnahmemotivation, die Datenqualitat, die Responsezeit und den Kodieraufwand analysiert werden .
Web survey bibliography (4086)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not...; 2017; Toepoel, V.; Emerson, H.
- Mind the Mode: Differences in Paper vs. Web-Based Survey Modes Among Women With Cancer; 2017; Hagan, T. L.; Belcher, S. M.; Donovan, H. S.
- Answering Without Reading: IMCs and Strong Satisficing in Online Surveys; 2017; Anduiza, E.; Galais, C.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Social desirability bias in self-reported well-being measures: evidence from an online survey; 2017; Caputo, A.
- Web-Based Survey Methodology; 2017; Wright, K. B.
- Handbook of Research Methods in Health Social Sciences; 2017; Liamputtong, P.
- Lessons from recruitment to an internet based survey for Degenerative Cervical Myelopathy: merits of...; 2017; Davies, B.; Kotter, M. R.
- Web Survey Gamification - Increasing Data Quality in Web Surveys by Using Game Design Elements; 2017; Schacht, S.; Keusch, F.; Bergmann, N.; Morana, S.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Comparability of web and telephone surveys for the measurement of subjective well-being; 2017; Sarracino, F.; Riillo, C. F. A.; Mikucka, M.
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- A Meta-Analysis of the Effects of Incentives on Response Rate in Online Survey Studies; 2017; Mohammad Asire, A.
- Telephone versus Online Survey Modes for Election Studies: Comparing Canadian Public Opinion and Vote...; 2017; Breton, C.; Cutler, F.; Lachance, S.; Mierke-Zatwarnicki, A.
- Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate...; 2017; Saleh, A.; Bista, K.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- Rates, Delays, and Completeness of General Practitioners’ Responses to a Postal Versus Web-Based...; 2017; Sebo, P.; Maisonneuve, H.; Cerutti, B.; Pascal Fournier, J.; Haller, D. M.
- Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary...; 2017; Meitinger, K.
- Nonresponse in Organizational Surveying: Attitudinal Distribution Form and Conditional Response Probabilities...; 2017; Kulas, J. T.; Robinson, D. H.; Kellar, D. Z.; Smith, J. A.
- Theory and Practice in Nonprobability Surveys: Parallels between Causal Inference and Survey Inference...; 2017; Mercer, A. W.; Kreuter, F.; Keeter, S.; Stuart, E. A.
- Is There a Future for Surveys; 2017; Miller, P. V.
- Reducing speeding in web surveys by providing immediate feedback; 2017; Conrad, F.; Tourangeau, R.; Couper, M. P.; Zhang, C.
- Social Desirability and Undesirability Effects on Survey Response Latencies; 2017; Andersen, H.; Mayerl, J.
- A Working Example of How to Use Artificial Intelligence To Automate and Transform Surveys Into Customer...; 2017; Neve, S.
- A Case Study on Evaluating the Relevance of Some Rules for Writing Requirements through an Online Survey...; 2017; Warnier, M.; Condamines, A.
- Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response...; 2017; Kappelhof, J. W. S.; De Leeuw, E. D.
- Targeted letters: Effects on sample composition and item non-response; 2017; Bianchi, A.; Biffignandi, S.